Merged
Also removed the -ai, -ask, and -followup combo requirements in lieu of using -chat. This flag launches a BubbleTea TUI chat session with an LLM and prompt preconfigured with the context of the summary it just created. The chat log is then saved to the kOutputDir destination. Example usage: summarize -i "go,sh" -chat
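To make the flag change concrete, here is a minimal sketch of how -chat could branch off the normal summarize flow. The -i and -chat flags come from the examples above; buildSummary and runChat are hypothetical stand-ins, not the tool's actual functions:

```go
package main

import (
	"flag"
	"fmt"
	"log"
)

// buildSummary and runChat are placeholder stand-ins for the tool's real
// summarization and chat entry points; they are not taken from the repository.
func buildSummary(include string) string {
	return "summary of files matching: " + include
}

func runChat(summary string) error {
	// The real tool would launch the BubbleTea TUI here, preloaded with summary.
	fmt.Println("chat session seeded with:", summary)
	return nil
}

func main() {
	include := flag.String("i", "", "comma-separated extensions to summarize")
	chat := flag.Bool("chat", false, "open a TUI chat session seeded with the summary")
	flag.Parse()

	summary := buildSummary(*include)

	if *chat {
		// Hand the freshly generated summary to the chat session instead of
		// only writing or printing it.
		if err := runChat(summary); err != nil {
			log.Fatal(err)
		}
		return
	}
	fmt.Println(summary)
}
```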
Co-authored-by: Copilot <[email protected]>
Pull Request Overview
This PR implements AI Chat Mode functionality for the summarize tool, allowing users to interact with an LLM-powered chat interface loaded with the context of their workspace summary. The implementation adds a -chat flag that opens a BubbleTea TUI chat session and integrates with various AI providers including Ollama.
Key Changes:
- Added AI chat functionality with configurable providers, models, and settings (see the provider-call sketch after this list)
- Implemented BubbleTea TUI for interactive chat interface
- Refactored codebase by extracting functionality into separate modules for better organization
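As a hedged illustration of the provider integration, the sketch below calls Ollama's /api/chat endpoint directly over HTTP. The endpoint, model name, and message layout are assumptions for illustration; the PR's ai.go may wire this up differently (for example through a client library):

```go
// Minimal, non-streaming chat request against a locally running Ollama server.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"net/http"
)

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
	Stream   bool          `json:"stream"`
}

type chatResponse struct {
	Message chatMessage `json:"message"`
}

func main() {
	req := chatRequest{
		Model: "llama3", // assumed model name; the tool makes this configurable
		Messages: []chatMessage{
			{Role: "system", Content: "You are chatting about this workspace summary: ..."},
			{Role: "user", Content: "What does main.go do?"},
		},
		Stream: false,
	}

	body, err := json.Marshal(req)
	if err != nil {
		log.Fatal(err)
	}
	resp, err := http.Post("http://localhost:11434/api/chat", "application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	var out chatResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		log.Fatal(err)
	}
	fmt.Println(out.Message.Content)
}
```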
Reviewed Changes
Copilot reviewed 16 out of 17 changed files in this pull request and generated 9 comments.
| File | Description |
|---|---|
| main.go | Major refactoring to extract functions and integrate chat functionality |
| chat.go | New BubbleTea TUI implementation for the AI chat interface (see the sketch after this table) |
| ai.go | AI provider configuration and initialization logic |
| configure.go | Extracted configuration setup with new AI-related settings |
| const.go | Centralized constants including AI configuration keys |
| var.go, var_funcs.go | Extracted variable definitions and helper functions |
| type.go, type_funcs.go | Extracted type definitions and methods |
| version.go | Extracted version handling logic |
| env.go | Environment variable handling utilities |
| reflect.go | File reflection utilities for chat logs |
| simplify.go | String deduplication utility |
| gz.go | Compression/decompression utilities |
| go.mod | Added dependencies for BubbleTea UI and AI integration |
| VERSION | Version bump to v1.1.0 |
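For readers unfamiliar with BubbleTea, the following is a minimal sketch of the kind of Model a chat TUI builds on. The struct fields and key handling here are illustrative assumptions, not the actual contents of chat.go:

```go
// Minimal BubbleTea chat model: type a line, press Enter to append it to the
// transcript, Esc or Ctrl+C to quit. A real chat model would send the input
// to the LLM and append its reply.
package main

import (
	"fmt"
	"log"
	"strings"

	tea "github.com/charmbracelet/bubbletea"
)

type chatModel struct {
	history []string // rendered transcript lines
	input   string   // text currently being typed
}

func (m chatModel) Init() tea.Cmd { return nil }

func (m chatModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	switch msg := msg.(type) {
	case tea.KeyMsg:
		switch msg.Type {
		case tea.KeyCtrlC, tea.KeyEsc:
			return m, tea.Quit
		case tea.KeyEnter:
			// This is where the real tool would call the configured LLM.
			m.history = append(m.history, "you: "+m.input)
			m.input = ""
		case tea.KeyBackspace:
			if len(m.input) > 0 {
				m.input = m.input[:len(m.input)-1]
			}
		default:
			m.input += msg.String()
		}
	}
	return m, nil
}

func (m chatModel) View() string {
	return strings.Join(m.history, "\n") + fmt.Sprintf("\n\n> %s", m.input)
}

func main() {
	if _, err := tea.NewProgram(chatModel{}).Run(); err != nil {
		log.Fatal(err)
	}
}
```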
Now with the -chat option, you can open a BubbleTea TUI chat session loaded with the context of your summarized workspace, which is about to be saved to disk or printed to screen depending on the other options you tag on alongside -chat. Example usage: summarize -i "go,mod" -chat. This lets you chat about your summarized workspace with the LLM agent of your choice. Tested with Ollama.
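Since the PR description says the chat log is saved to the kOutputDir destination, here is a hedged sketch of what that persistence step could look like. The default directory value, file name format, and saveChatLog helper are assumptions, not the actual code:

```go
// Hypothetical chat-log persistence: writes the transcript to a timestamped
// file under the configured output directory.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
	"strings"
	"time"
)

const kOutputDir = "./summaries" // assumed default; the tool reads this from its configuration

func saveChatLog(transcript []string) (string, error) {
	if err := os.MkdirAll(kOutputDir, 0o755); err != nil {
		return "", err
	}
	name := fmt.Sprintf("chat-%s.log", time.Now().Format("20060102-150405"))
	path := filepath.Join(kOutputDir, name)
	if err := os.WriteFile(path, []byte(strings.Join(transcript, "\n")+"\n"), 0o644); err != nil {
		return "", err
	}
	return path, nil
}

func main() {
	path, err := saveChatLog([]string{
		"you: what does main.go do?",
		"assistant: it wires up the CLI flags and the summarizer...",
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("chat log written to", path)
}
```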